Chapter 5. Cognitive Dissonance & Operator Error From the Rogovin Report

The quote below is taken directly from the Rogovin Report (Specific Conclusion 6).

 

“6. In reviewing the incident at Davis Besse, one can see several indications that the PORV was open and that the reactor coolant system inventory was decreasing. With the benefit of hindsight the operators' actions appear to include a number of errors. These errors include stopping the high pressure injection pumps as the reactor coolant system approached saturation conditions and the delay in closing the PORV block valve.

 

Study of the behavior of highly trained people under emergency conditions suggests that such people rarely make simple blunders in the operation of systems. Such people typically are highly disciplined; trained to follow procedures carefully; trained to avoid improvisation; and intensely aware of rules and constraints. Compared with the average person, they rarely make tactical errors in the sense of accidentally turning the wrong knob. Nevertheless, such trained people sometimes do make errors in emergencies. To distinguish these from the ordinary kind of errors, we may call these "strategic" errors. In an emergency such people recognize that something is wrong and that some action must be taken. They conceive a model or scenario for what is happening. They follow procedures or reaction strategy which they believe is applicable to the scenario. Studies also show that once a scenario is conceived and a reaction strategy undertaken, there is a tendency not to seek or perceive additional data which contradict the original scenario. There is a psychological phenomenon called "cognitive dissonance" which makes the mind tend to reject data in conflict with the original hypothesis.

 

After an incorrect scenario is conceived, an entire pattern of actions can be taken which in retrospect are blunders. This phenomenon can be seen to a limited extent during the September 24, 1977 incident at Davis Besse, and to a much greater extent during the TMI accident. However, it does not appear that this phenomenon has ever been addressed in the design or licensing of nuclear power plants. The implications of this phenomenon are considerable since it implies that any sequence of actions by an operator, no matter how ill advised it may seem to a dispassionate observer, (i.e., the designer) may in fact be a creditable event that must be considered in accident analyses.”

--------------------------------------------------------------------------------------------------------------------------------------------

 

When I first read this Conclusion Statement I didn’t even know what “cognitive dissonance” meant, so I looked it up; below are a couple of typical explanations I found useful. After reading these, and some of Specific Conclusion 6 again, I kind'a, sort'a had the idea: it’s a noun, but it describes a feeling. I will note that only a “shrink” would be so arrogant as to tell me what I was feeling, especially after the fact; they had me testifying under oath and could simply have asked. But the definitions are also a bit confusing. I get the idea that it affects behavior, but Rogovin seems to imply that you don’t modify your behavior, while the definitions say you do (or can) modify behavior, attitudes, or beliefs to make the “dissonance” go away. So they seem to get you coming or going on the definition; the conclusion write-up, though, focuses on the case where new information is rejected in order to maintain a “conceived” model.

--------------------------------------------------------------------------------------------------------------------------------------------

 

Cognitive Dissonance

Mental conflict that occurs when beliefs or assumptions are contradicted by new information. The concept was introduced by the psychologist Leon Festinger (1919–89) in the late 1950s. He and later researchers showed that, when confronted with challenging new information, most people seek to preserve their current understanding of the world by rejecting, explaining away, or avoiding the new information or by convincing themselves that no conflict really exists. Cognitive dissonance is nonetheless considered an explanation for attitude change.

 

Cognitive dissonance refers to a situation involving conflicting attitudes, beliefs or behaviors.

This produces a feeling of discomfort, leading to an alteration in one of the attitudes, beliefs, or behaviors to reduce the discomfort and restore balance, etc.

Festinger's (1957) cognitive dissonance theory suggests that we have an inner drive to hold all our attitudes and beliefs in harmony and avoid disharmony (or dissonance).

 

This Conclusion Statement is quite literally so full of crap, on so many levels, with respect to my actions during the Davis Besse event that it is not worth my effort to address them all. But I will hit the high points relative to Cognitive Dissonance, turning off High Pressure Injection, and the PORV. I will further add that I don’t care what these “shrinks” concluded after studying data in a tabletop environment for days on end and then “imagining,” nearly two years after the fact, what I did or did not think, or why I did or did not do something. I am the expert, I was there, I know why I did the things I did, and I know what I was thinking. My message to the writers/developers of this Conclusion Statement is simply this: I am right by definition, and you are wrong. I even understand how you got to your conclusion; you are suffering a classic, textbook case of Cognitive Dissonance, as illustrated by your total rejection of the “new information” I provided in my sworn testimony to your Commission. This happened because of the Institutional Arrogance which does not allow you to accept the fact that “Operator Error” was not the root cause of your problem. But since that was the starting point for your internal investigation and discussion, you forced that conclusion out of your investigation (see Footnote 1). I will also point out the obvious fact that if you, and not your “shrinks,” had spent one-tenth of the time understanding the Davis Besse event immediately after it happened that you later spent inventing how I think, two years afterward in the wake of TMI, then TMI would not have occurred.

 

The first point that I want to emphasize relative to Cognitive Dissonance is the “conceived model, belief, current understanding,” etc. The Behaviorists who study “the behavior of highly trained people under emergency conditions” are, for the most part, correct. But it seems to me that inherent in this process is the assumption that the “conceived model” has been developed by a person “in their head.” That was not the case here, and in fact it is not even conceivable that at least a dozen different people (5 at Davis Besse and several more at TMI) all independently conceived the same wrong model. And that’s because they didn’t; they simply used the one that had been “inserted” in their heads during their training and further reinforced by their Simulator training. Quite simply, the model we were taught for a leak in the Pressurizer steam space said the Universe revolved around the Earth. Your statement “They conceive a model or scenario for what is happening” is the perfect example of why you people never “got it.” This statement is your starting point, and since you were unable (or unwilling) to accept any other premise, the result is GIGO: garbage in, garbage out.

 

The second point I want to emphasize relative to Cognitive Dissonance is that your conclusion states, “Studies also show that once a scenario is conceived and a reaction strategy undertaken, there is a tendency not to seek or perceive additional data which contradict the original scenario.” My recall of my testimony is that I specifically told you I realized something was going on that we had not been told about (thus I hadn’t “locked in” on any scenario). I further stated that I knew I was going to have to figure out a “reaction strategy,” because nothing I had available was going to work; everything available to me at the time applied only to the “conceived model” we had been taught, this was obviously a new model, and in fact I personally was not referring to any EOP (Emergency Operating Procedure). I further remember being aware I was missing something, so I was trying “to seek or perceive additional data.” It seems to me my behavior was the direct opposite of what defines Cognitive Dissonance, yet you accuse me of Cognitive Dissonance and Operator Errors; why?

 

The third point I want to emphasize is about turning off High Pressure Injection. It is common knowledge how the old single-event EOPs (including their major Caution statements) were expected to be implemented by Operators, which, by the way, is the same way we were trained to implement them. The EOP Immediate Operator Action steps had to be committed to memory; this also applied to several Caution statements. When called upon, these steps were expected to be implemented by an Operator in what I would describe as a “Robotic Nature,” without question. The false emphasis placed on never going solid, or never letting High Pressure Injection fill the Pressurizer completely, was expected to be implemented in the same Robotic Nature, reinforced by the fact that you would “fail” a Simulator exam if you allowed it to happen. Thus you made this conclusion:

 

“In reviewing the incident at Davis Besse, one can see several indications that the PORV was open and that the reactor coolant system inventory was decreasing. With the benefit of hindsight the operators' actions appear to include a number of errors. These errors include stopping the high pressure injection pumps as the reactor coolant system approached saturation conditions and the delay in closing the PORV block valve.”

 

This conclusion statement is basically, in my opinion, a Material False Statement. We had one indicator for “reactor coolant system inventory,” the Pressurizer level, not “several” as you state. And it was increasing, not decreasing as you state, at the time we shut off High Pressure Injection. We had no “saturation conditions” indicator as you imply (although since TMI one has been implemented). Since you were aware of all this at the time you wrote your Conclusion Statement, it is a Material False Statement. But it was not my Operator Error to do it: not at the time I did it, not an hour later after I understood what had happened, not at the time TMI did it, not at the time I gave sworn testimony to your Commission, not today as I write this, and not tomorrow. Period. (Do I need to add “watch my lips”?)
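For readers who have never seen one, the post-TMI “saturation conditions” (subcooling margin) monitor does nothing exotic: it compares the measured coolant temperature against the saturation temperature for the measured RCS pressure and displays the difference. When that margin shrinks toward zero, the coolant is “approaching saturation conditions.” The sketch below is purely illustrative and is not the plant computer’s code; the steam-table entries are approximate, rounded values and the function names are my own. The point is simply that this is a computed quantity, and in 1977 nothing in my Control Room computed or displayed it.

```python
# Illustrative sketch only: approximate saturation temperatures (deg F) at a few
# RCS pressures (psia), rounded from standard steam tables. A real subcooling
# margin monitor uses full steam tables and qualified pressure/temperature inputs.
SAT_TABLE = [
    (600.0, 486.2),
    (1000.0, 544.6),
    (1500.0, 596.2),
    (2000.0, 635.8),
    (2250.0, 652.9),
]

def saturation_temp_f(pressure_psia):
    """Linearly interpolate an approximate saturation temperature (deg F)."""
    if pressure_psia <= SAT_TABLE[0][0]:
        return SAT_TABLE[0][1]
    if pressure_psia >= SAT_TABLE[-1][0]:
        return SAT_TABLE[-1][1]
    for (p_lo, t_lo), (p_hi, t_hi) in zip(SAT_TABLE, SAT_TABLE[1:]):
        if p_lo <= pressure_psia <= p_hi:
            frac = (pressure_psia - p_lo) / (p_hi - p_lo)
            return t_lo + frac * (t_hi - t_lo)

def subcooling_margin_f(rcs_pressure_psia, coolant_temp_f):
    """Subcooling margin = T_sat(P) - T_coolant.
    Near zero means the coolant is approaching saturation (steam can form);
    a healthy positive margin means it is not."""
    return saturation_temp_f(rcs_pressure_psia) - coolant_temp_f

# Example: at roughly 1500 psia and a 582 deg F hot-leg temperature,
# the margin is only about 14 deg F -- close to saturation.
print(subcooling_margin_f(1500.0, 582.0))
```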

 

Also, regarding your conclusion “With the benefit of hindsight the operators' actions appear to include a number of errors”: this sentence contains an overabundance of typos and misspellings. I’m sure you meant to write “With the failure of NRC oversight of the Davis Besse event, the NRC inactions appear to include a number of errors that contributed to the root cause of the TMI accident.”

 

The fourth point I want to emphasize is a few words about the PORV. You conclude there were “several” indications the PORV was open and that there was a “delay” in closing the PORV block valve. From Webster’s:

Sev·er·al: adjective \'sev-rel\

: more than two but not very many

: different and separate

 

Before I discuss “indications,” I’ll remind you that the Instruments and Controls I need to cope with Design Basis Events are required to be Safety Grade and actually installed in my Control Room, per General Design Criterion 13. So it is quite possible you are referring to indications you thought were not needed to cope with this event when you certified the design.

 

So we can continue this discussion, please specifically provide at least three different and separate indications, actually installed in my control room at the time of the event, which unambiguously show the PORV failed open. Also please provide the exact points on the event timeline at which those indications occurred. Hint: cryptic explanations of adiabatic expansions and a Mollier Diagram “need not apply,” as we had no Mollier meter, and that approach also requires at least looking at the PORV tailpipe temperature, which I already testified I did not do. (I will admit such mumbo jumbo might be useful if someone wants to pass a regulation requiring a Shift Technical Advisor (STA) position added to the Control Room because they think Operators are stupid, but I don’t/didn’t need your STA for the reasons you think I needed one.) And my reason for not looking at the PORV tailpipe temperature still remains the same some thirty-five years later: I knew it blew; the tailpipe would be hot; the position indicator said it was closed. If someone asked me whether a just-turned-off stove burner was really off, I’d look at the burner control switch and respond; you’d probably touch the stove burner, and what exactly did that get you? As to the “delay,” please provide the time on the event timeline so I can compare it to what I was doing at that time. I’d say I closed the block valve “soon enough.” To address, again, the point that the Quench Tank Rupture Disk failed, I’d say “so what?” From my actual plant experience I had seen it fail after a successful PORV lift/re-close cycle. So where is the “new information”? For all I knew, the Quench Tank either had a design deficiency or was never designed to accommodate the heat removal from a PORV lift with its cooling water system and back-pressure regulator vent line closed, both of which happened with the SFAS (Safety Features Actuation System) actuation and both of which I was well aware of. Quite honestly, I probably didn’t give ten seconds of thought to the whole thing at the time the rupture disk blew, and within about a minute and a half I was more concerned about why the Pressurizer level was rapidly increasing, to off-scale, when we were no longer adding water to the RCS.

 

As a result of circumstances you absolutely fail to acknowledge in your Conclusion Statement, I was put in a box, totally outside the Design Basis understanding of PWR technology for an SBLOCA (small-break loss-of-coolant accident) in the steam space of the Pressurizer. Your Conclusion Statement makes it sound like I “conceived a model” and proceeded. Nope. I’m actually not bright enough to “conceive the model”; I simply used the one I was given, as did the TMI Operators. The fact that it was due to a failed PORV is not even relevant to a correct understanding of the event, a point (among others) you didn’t seem to grasp. In fact it was “lucky,” because a PORV is isolable and some locations in the steam space are not, yet those other, non-isolable locations would have responded exactly the same way. By five minutes into the event I had responded exactly as the bogus model of the event expected me to, and as my training had reinforced. By ten to twelve minutes into the event, looking at what the plant was telling me, I was aware something was wrong with the “model,” to the point that within a short time I announced to the Control Room why the Pressurizer was full. You people really think this whole thing should have been simple to diagnose and accept. But it was as foreign to me in those several minutes as the idea that the Earth is not the center of the Universe. I think I did what is the first rule of Emergency Procedures for airplane pilots: fly the airplane first while you figure it out, which I did. And yet, in spite of all that, you accuse me of Cognitive Dissonance and Operator Errors.

 

Footnote 1. There is evidence to support that the Kemeny Commission started out on the wrong foot in this regard as well. See “Memories of the Kemeny Commission,” by Ronald M. Atchison, Nuclear News, March 2004 (March 3 - 2004-3-3.pdf).

 

Footnote 2. A quote from the Rogovin Report Specific Conclusions:

"1. The incident that occurred at Davis Besse is almost an exact copy of the accident that occurred at TMI. The reasons that Davis Besse did not sustain the severe core damage that resulted at TMI are that the Davis Besse plant had been operating at a very low power level and had a very low power history, and the operators at Davis Besse were able to identify and isolate the open PORV in 20 minutes as opposed to 2 hours at TMI. If it had not been for these fortuitous conditions, it is very likely that the incident at Davis Besse would have been as severe as the subsequent accident at TMI-2."

I think these guys at least owe me a T-Shirt:

"Despite the fact that everybody who touched

this event couldn't seem to find their own ass

with either hand, this guy gets our

Blind Squirrel Award."